Heteroscedastic Factor Mixture Analysis

Authors

  • Angela Montanari
  • Cinzia Viroli
Abstract

When data come from an unobserved heterogeneous population, common factor analysis is not appropriate for estimating the underlying constructs of interest. By replacing the traditional assumption of Gaussian-distributed factors with a finite mixture of multivariate Gaussians, the unobserved heterogeneity can be modelled by latent classes. In so doing we obtain a particular factor mixture analysis with heteroscedastic components. In this paper the model is presented and a maximum likelihood estimation procedure via the EM algorithm is developed. We also show that the approach performs well as a dimensionally reduced model-based clustering method. Two real applications are illustrated and performance is compared to standard model-based clustering methods.
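The model described above combines a factor-analytic observation equation with mixture-distributed factors. As a minimal sketch of the data-generating process (not the authors' estimation procedure), the following simulates observations x = Λz + e, where the latent factors z follow a two-component Gaussian mixture with component-specific (heteroscedastic) covariances; all dimensions and parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: p observed variables, q latent factors, K components.
p, q, K, n = 5, 2, 2, 1000

# Mixture weights, means, and heteroscedastic covariances of the latent factors.
weights = np.array([0.4, 0.6])
means = np.array([[-2.0, 0.0], [2.0, 1.0]])                   # shape (K, q)
covs = np.array([np.diag([0.5, 0.3]), np.diag([1.2, 0.8])])   # shape (K, q, q)

Lambda = rng.normal(size=(p, q))           # factor loadings
Psi = np.diag(rng.uniform(0.1, 0.5, p))    # diagonal specific-noise covariance

# Draw component labels, then factors from the chosen component, then observations.
labels = rng.choice(K, size=n, p=weights)
z = np.stack([rng.multivariate_normal(means[k], covs[k]) for k in labels])
x = z @ Lambda.T + rng.multivariate_normal(np.zeros(p), Psi, size=n)

print(x.shape)  # (1000, 5)
```

Because the factor covariance differs across components, the induced marginal of x is itself a Gaussian mixture with heteroscedastic components, which is what makes the model usable for dimension-reduced clustering.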


Related articles

Fixed point rules for heteroscedastic Gaussian kernel-based topographic map formation

Abstract— We develop a number of fixed point rules for training homogeneous, heteroscedastic but otherwise radially-symmetric Gaussian kernel-based topographic maps. We extend the batch map algorithm to the heteroscedastic case and introduce two candidates of fixed point rules for which the end-states, i.e., after the neighborhood range has vanished, are identical to the maximum likelihood Gaus...

Full text

Initializing EM algorithm for univariate Gaussian, multi-component, heteroscedastic mixture models by dynamic programming partitions

In this paper we present and evaluate a methodology for estimating initial values of parameters of univariate, heteroscedastic Gaussian mixtures, on the basis of a dynamic programming algorithm for partitioning the range of observations into bins. We evaluate variants of the dynamic programming method corresponding to different scoring functions for partitioning. For both simulated and real data-se...

Full text
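The dynamic-programming idea in the abstract above can be illustrated with a small sketch: split sorted one-dimensional data into k contiguous bins so as to minimize total within-bin sum of squared errors, then read off initial mixture weights, means, and standard deviations from the bins. This is a generic within-bin-SSE scoring function, not necessarily the one used in the paper:

```python
import numpy as np

def dp_partition(x, k):
    """Split sorted 1-D data into k contiguous bins minimizing within-bin SSE,
    and return initial GMM parameters (weights, means, sds) from the bins."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # Prefix sums give O(1) SSE for any segment x[i:j].
    s = np.concatenate([[0.0], np.cumsum(x)])
    s2 = np.concatenate([[0.0], np.cumsum(x * x)])

    def sse(i, j):
        m = j - i
        return s2[j] - s2[i] - (s[j] - s[i]) ** 2 / m

    INF = float("inf")
    cost = np.full((k + 1, n + 1), INF)   # cost[b, j]: best split of x[:j] into b bins
    cut = np.zeros((k + 1, n + 1), int)   # cut[b, j]: start of the last bin
    cost[0, 0] = 0.0
    for b in range(1, k + 1):
        for j in range(b, n + 1):
            for i in range(b - 1, j):
                c = cost[b - 1, i] + sse(i, j)
                if c < cost[b, j]:
                    cost[b, j] = c
                    cut[b, j] = i

    # Backtrack the optimal bin boundaries.
    bounds, j = [n], n
    for b in range(k, 0, -1):
        j = cut[b, j]
        bounds.append(j)
    bounds = bounds[::-1]
    bins = [x[bounds[b]:bounds[b + 1]] for b in range(k)]

    pi = np.array([len(bin_) / n for bin_ in bins])          # initial weights
    mu = np.array([bin_.mean() for bin_ in bins])            # initial means
    sigma = np.array([bin_.std() + 1e-6 for bin_ in bins])   # initial sds
    return pi, mu, sigma

# Example: two well-separated components are recovered as the two bins.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0.0, 0.5, 100), rng.normal(5.0, 0.5, 100)])
pi, mu, sigma = dp_partition(data, 2)
```

The O(k·n²) recursion here is a straightforward baseline; the resulting (pi, mu, sigma) would then seed the EM iterations for the heteroscedastic mixture.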

Comparison of Methods for Initializing EM Algorithm for Estimation of Parameters of Gaussian, Multi-component, Heteroscedastic Mixture Models

A basic approach to estimation of mixture model parameters is by using expectation maximization (EM) algorithm for maximizing the likelihood function. However, it is essential to provide the algorithm with proper initial conditions, as it is highly dependent on the first estimation (“guess”) of parameters of a mixture. This paper presents several different initial condition estimation methods, ...

Full text

Fitting a Finite Mixture Distribution to a Variable Subject to Heteroscedastic Measurement Error

We consider the case where a latent variable X cannot be observed directly and instead a variable W = X + U with a heteroscedastic measurement error U is observed. It is assumed that the distribution of the true variable X is a mixture of normals, and a type of EM algorithm is applied to find approximate ML estimates of the distribution parameters of X.

Full text
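The measurement-error setup in the abstract above can be sketched as a small simulation (the data-generating assumptions only, not the paper's EM estimator; all parameter values are illustrative): the latent X follows a mixture of normals, each observation carries its own error scale, and only W = X + U is seen by the analyst:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500

# Latent X: two-component mixture of normals (weights, means, sds illustrative).
comp = rng.choice(2, size=n, p=[0.3, 0.7])
X = np.where(comp == 0, rng.normal(-1.0, 0.4, n), rng.normal(2.0, 0.6, n))

# Heteroscedastic measurement error: each observation has its own error scale s_i.
s = rng.uniform(0.1, 0.5, n)
U = rng.normal(0.0, 1.0, n) * s

# Only W is observed; an EM procedure would use W (and the known s_i) to
# recover the mixture parameters of the unobserved X.
W = X + U
print(W.shape)  # (500,)
```

Note that averaging over the error, E[W] equals E[X], but the observed variance is inflated by the error variances, which is why a naive mixture fit to W overestimates the component spreads.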

Robust maximum likelihood training of heteroscedastic probabilistic neural networks

We consider the probabilistic neural network (PNN) that is a mixture of Gaussian basis functions having different variances. Such a Gaussian heteroscedastic PNN is more economic, in terms of the number of kernel functions required, than the Gaussian mixture PNN of a common variance. The expectation-maximisation (EM) algorithm, although a powerful technique for constructing maximum likelihood (M...

Full text


Journal title:

Volume   Issue

Pages  -

Publication date: 2009